negative stereotype
Evaluating Large Language Models through Gender and Racial Stereotypes
Language Models have ushered in a new age of AI, gaining traction within the NLP community as well as amongst the general population. AI's ability to make predictions and generate text, and its applications in sensitive decision-making scenarios, make it even more important to study these models for possible biases that may exist and that can be exaggerated. We conduct a qualitative comparative study and establish a framework to evaluate language models under the premise of two kinds of biases, gender and race, in a professional setting. We find that while gender bias has decreased markedly in newer models compared to older ones, racial bias still exists.
- North America > United States > Georgia > Fulton County > Atlanta (0.04)
- Europe > Norway (0.04)
- Asia > Pakistan (0.04)
Revealed: What the average people in 13 UK counties look like, according to AI - so do YOU agree?
The UK is home to 92 counties, each with its own distinctive look and feel. Now, a film editor has tasked artificial intelligence (AI) with putting faces to these counties - with hilarious results. Duncan Thomsen, 53, used the software Midjourney to create images of 'average people' in 13 counties. The results suggest that the average residents in County Antrim are young with red hair, while people living in Anglesey are elderly (and wrapped up for the cold weather!). So, do you agree with what AI thinks the average people look like in your county?
- Europe > United Kingdom > Northern Ireland > County Antrim (0.27)
- Europe > United Kingdom > England > Tyne and Wear (0.18)
- Europe > United Kingdom > England > Oxfordshire (0.08)
- (3 more...)
MailOnline asks ChatGPT to come up with a stereotype for residents in all UK counties
ChatGPT has revealed some scathing stereotypes of UK residents in a merciless study of what clichés exist in every county. The cutting-edge bot labeled Yorkshiremen as 'rude' while Londoners were slammed for their arrogance in the nationwide analysis. The truly insulting results came after MailOnline asked ChatGPT to expose what 'negative stereotypes' exist of people from our nation. While the bot insisted that it did not condone stereotypes, it offered a list of those associated with each place when prompted. On the whole, residents of the UK were deemed to have bad teeth while being overly polite and obsessed with the Royal Family.
- Europe > United Kingdom > England > Nottinghamshire (0.07)
- Europe > United Kingdom > Scotland > Aberdeenshire (0.07)
- Europe > United Kingdom > England > Cambridgeshire (0.07)
- (3 more...)
Here is what ChatGPT thinks of people in every US state
ChatGPT has been accused of being woke and shying away from offensive feedback -- but not when it comes to negative stereotypes about Americans. ChatGPT stated that people in Alabama are 'hillbillies', Idahoans are 'gun-toting survivalists', Wisconsinites are 'heavy drinkers' and people in Iowa are just plain 'boring'. When it came to the most populous states, the AI said New Yorkers are rude, Californians are superficial, Texans are pro-gun, Floridians are crazy and Pennsylvanians are unwelcoming to outsiders. However, not all stereotypes were offensive. The bot described Ohioans as down-to-earth, New Mexicans as spiritual, residents in Oregon as hipsters and Nebraskans as friendly.
- North America > United States > New York (0.28)
- North America > United States > Oregon (0.25)
- North America > United States > Iowa (0.25)
- (8 more...)
Words of Wisdom: Representational Harms in Learning From AI Communication
Buddemeyer, Amanda, Walker, Erin, Alikhani, Malihe
Many educational technologies use artificial intelligence (AI) that presents generated or produced language to the learner. We contend that all language, including all AI communication, encodes information about the identity of the human or humans who contributed to crafting the language. With AI communication, however, the user may index identity information that does not match the source. This can lead to representational harms if language associated with one cultural group is presented as "standard" or "neutral", if the language advantages one group over another, or if the language reinforces negative stereotypes. In this work, we discuss a case study using a Visual Question Generation (VQG) task involving gathering crowdsourced data from targeted demographic groups. Generated questions will be presented to human evaluators to understand how they index the identity behind the language, whether and how they perceive any representational harms, and how they would ideally address any such harms caused by AI communication. We reflect on the educational applications of this work as well as the implications for equality, diversity, and inclusion (EDI).
- North America > United States > Pennsylvania > Allegheny County > Pittsburgh (0.04)
- Europe > United Kingdom > England > Cambridgeshire > Cambridge (0.04)
- North America > United States > New York (0.04)
- (5 more...)
- Instructional Material (0.66)
- Research Report (0.50)
Louisville gamer startup is changing the negative stereotypes around video games
LOUISVILLE – The culture surrounding video games is often shrouded in stereotypes and negative connotations. How often have we heard the narrative, especially following mass shootings in America, that video games are linked to violent behavior? Or that people who play video games are "basement dwellers" with no life. Or even the idea that all video game developers are Silicon Valley tech bros and it's a "man's world."
- Information Technology > Communications > Social Media (1.00)
- Information Technology > Artificial Intelligence > Games (1.00)
Being smarter means you are more likely to use stereotypes
People with higher cognitive abilities are often better able to spot patterns in the world around them, allowing them to excel in a wide range of tasks, from learning languages to recognizing faces. But, in some situations, even being intelligent has its drawbacks. A new study has found that people with higher cognitive abilities are more likely to stereotype others based on the patterns they detect, potentially leading to negative consequences as they perpetuate social biases. In the study, the researchers manipulated image-description pairings so that faces with particular features were linked to negative stereotypes.
When bias in product design means life or death
Carol E. Reiley is the co-founder and president of Drive.ai. She previously founded Tinkerbelle Laboratories. During my Ph.D. studies, I developed a voice-activated human-robot interface for a surgical robotic system using Microsoft's speech recognition API. But, because the API had been built mainly by men in their 20s and 30s, it did not recognize my voice. I had to lower my pitch in order for it to work.
- Information Technology (0.73)
- Automobiles & Trucks (0.50)
- Transportation > Ground > Road (0.32)